Sampling Algorithms and Coresets for ℓp Regression
Authors
Abstract
The ℓp regression problem takes as input a matrix A ∈ ℝ^{n×d}, a vector b ∈ ℝ^n, and a number p ∈ [1,∞), and it returns as output a number Z and a vector x_opt ∈ ℝ^d such that Z = min_{x ∈ ℝ^d} ‖Ax − b‖_p = ‖Ax_opt − b‖_p. In this paper, we construct coresets and obtain an efficient two-stage sampling-based approximation algorithm for the very overconstrained (n ≫ d) version of this classical problem, for all p ∈ [1,∞). The first stage of our algorithm non-uniformly samples r̂_1 = O(36^p d^{max{p/2+1, p}+1}) rows of A and the corresponding elements of b, and then it solves the ℓp regression problem on the sample; we prove this is an 8-approximation. The second stage of our algorithm uses the output of the first stage to resample r̂_1/ε² constraints, and then it solves the ℓp regression problem on the new sample; we prove this is a (1 + ε)-approximation. Our algorithm unifies, improves upon, and extends the existing algorithms for special cases of ℓp regression, namely p = 1, 2 [10, 13]. In the course of proving our result, we develop two concepts, well-conditioned bases and subspace-preserving sampling, that are of independent interest.
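To make the two-stage scheme concrete, the following minimal Python sketch mirrors its structure under simplifying assumptions: a QR-based orthonormal basis stands in for the paper's well-conditioned basis, the second-stage resampling probabilities are taken from the residuals of the first-stage solution rather than the paper's exact construction, and a generic numerical solver plays the role of an exact ℓp solver. The function names (two_stage_lp_regression, lp_regression, sample_rows) and the sample sizes r1, r2 are illustrative and not taken from the paper.

import numpy as np
from scipy.optimize import minimize


def lp_regression(A, b, p, x0=None):
    # Minimize ||Ax - b||_p numerically; adequate for the small sampled problems here.
    if x0 is None:
        x0 = np.linalg.lstsq(A, b, rcond=None)[0]   # least-squares warm start
    obj = lambda x: np.sum(np.abs(A @ x - b) ** p)
    return minimize(obj, x0, method="Powell").x


def sample_rows(A, b, probs, r, p, rng):
    # Sample r rows i.i.d. with replacement; rescale so the sampled l_p^p
    # objective is an unbiased estimate of the full one.
    idx = rng.choice(len(b), size=r, p=probs)
    scale = (r * probs[idx]) ** (-1.0 / p)
    return A[idx] * scale[:, None], b[idx] * scale


def two_stage_lp_regression(A, b, p, r1, r2, seed=0):
    rng = np.random.default_rng(seed)

    # Stage 1: sampling probabilities from (a stand-in for) a well-conditioned basis.
    U, _ = np.linalg.qr(A)
    q1 = np.sum(np.abs(U) ** p, axis=1)
    q1 /= q1.sum()
    A1, b1 = sample_rows(A, b, q1, r1, p, rng)
    x1 = lp_regression(A1, b1, p)        # coarse, constant-factor approximation

    # Stage 2: resample using the stage-1 residuals (simplified), then re-solve.
    res = np.abs(A @ x1 - b) ** p
    q2 = res / res.sum()
    A2, b2 = sample_rows(A, b, q2, r2, p, rng)
    return lp_regression(A2, b2, p, x0=x1)

In the notation of the abstract, r1 plays the role of r̂_1 and r2 the role of r̂_1/ε²; the point of the sketch is only the shape of the two stages, not the guarantees.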
Similar papers
Sampling algorithms and coresets for ℓp regression
The ℓp regression problem takes as input a matrix A ∈ ℝ^{n×d}, a vector b ∈ ℝ^n, and a number p ∈ [1,∞), and it returns as output a number Z and a vector x_OPT ∈ ℝ^d such that Z = min_{x ∈ ℝ^d} ‖Ax − b‖_p = ‖Ax_OPT − b‖_p. In this paper, we construct coresets and obtain an efficient two-stage sampling-based approximation algorithm for the very overconstrained (n ≫ d) version of this classical problem, for all p ∈ [1,...
Sampling Algorithms and Coresets
The ℓp regression problem takes as input a matrix A ∈ ℝ^{n×d}, a vector b ∈ ℝ^n, and a number p ∈ [1,∞), and it returns as output a number Z and a vector x_opt ∈ ℝ^d such that Z = min_{x ∈ ℝ^d} ‖Ax − b‖_p = ‖Ax_opt − b‖_p. In this paper, we construct coresets and obtain an efficient two-stage sampling-based approximation algorithm for the very overconstrained (n ≫ d) version of this classical problem, for all p ∈...
Rich Coresets For Constrained Linear Regression
A rich coreset is a subset of the data that contains nearly all of the essential information. We give deterministic, low-order polynomial-time algorithms to construct rich coresets for simple and multiple-response linear regression, together with lower bounds indicating that there is not much room for improvement upon our results.
A Unified Approach for Design of Lp Polynomial Algorithms
By summarizing Khachiyan's algorithm and Karmarkar's algorithm for linear programming (LP), a unified methodology for the design of polynomial-time algorithms for LP is presented in this paper. A key concept is the so-called extended binary search (EBS) algorithm introduced by the author. It is used as a unified model to analyze the complexities of the existing modern LP algorithms and, possibly, help ...
Subgradient and Sampling Algorithms for ℓ1 Regression
Given an n × d matrix A and an n-vector b, the ℓ1 regression problem is to find the vector x minimizing the objective function ‖Ax − b‖_1, where ‖y‖_1 ≡ Σ_i |y_i| for a vector y. This paper gives an algorithm needing O(n log n) d^{O(1)} time in the worst case to obtain an approximate solution, with objective function value within a fixed ratio of optimum. Given ε > 0, a solution whose value is within 1 + ε of opti...
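For reference, ℓ1 regression can also be solved exactly as a linear program, which is the baseline that subgradient and sampling methods such as the one above aim to beat on large inputs. The sketch below uses the standard reformulation min Σ_i t_i subject to −t ≤ Ax − b ≤ t with SciPy's linprog; it illustrates the problem being solved, not that paper's algorithm.

import numpy as np
from scipy.optimize import linprog


def l1_regression(A, b):
    # Exact l1 regression via the standard LP reformulation:
    # minimize sum(t) subject to -t <= Ax - b <= t, with variables [x (d), t (n)].
    n, d = A.shape
    c = np.concatenate([np.zeros(d), np.ones(n)])
    A_ub = np.block([[A, -np.eye(n)],
                     [-A, -np.eye(n)]])
    b_ub = np.concatenate([b, -b])
    bounds = [(None, None)] * d + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:d]

Solving this LP directly costs time polynomial in n, which is what motivates the sampling-based approaches above when n ≫ d.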
Journal title:
Volume / issue:
Pages: -
Publication date: 2007